A sum-of-product neural network (SOPNN)

Authors

  • Chun-Shin Lin
  • Chien-Kuo Li
Abstract

This paper presents a sum-of-product neural network (SOPNN) structure. The SOPNN can learn to implement the static mappings that multilayer neural networks and radial basis function networks normally perform. The output of the neural network has the sum-of-product form $\sum_{i=1}^{N_p} \prod_{j=1}^{N_v} f_{ij}(x_j)$, where the $x_j$'s are inputs, $N_v$ is the number of inputs, $f_{ij}(\cdot)$ is a function generated through network training, and $N_p$ is the number of product terms. The function $f_{ij}(x_j)$ can be expressed as $\sum_k w_{ijk} B_{jk}(x_j)$, where $B_{jk}(\cdot)$ is a single-variable basis function and the $w_{ijk}$'s are weight values. Linear memory arrays can be used to store the weights. If $B_{jk}(\cdot)$ is a Gaussian function, the new neural network degenerates to a Gaussian function network. This paper focuses on the use of overlapped rectangular pulses as the basis functions. With such basis functions, $w_{ijk} B_{jk}(x_j)$ will equal either zero or $w_{ijk}$, and the computation of $f_{ij}(x_j)$ becomes a simple addition of some retrieved $w_{ijk}$'s. The structure can be viewed as a basis function network with a flexible form for the basis functions. Learning can start with a small set of submodules and have new submodules added when it becomes necessary. The new neural network structure demonstrates excellent learning convergence characteristics and requires small memory space. It has merits over multilayer neural networks, radial basis function networks and CMAC in function approximation and mapping in high-dimensional input space. The technique has been tested for function approximation, prediction of a time series, learning control, and classification. © 2000 Elsevier Science B.V. All rights reserved.
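The abstract describes the forward computation concretely enough to sketch it in code. The Python below is a minimal illustration under stated assumptions, not the authors' implementation: the class name RectPulseSOPNN, the number of pulses per input, the pulse layout, and the overlap parameter are all choices made for this example.

```python
import numpy as np

# Minimal sketch of an SOPNN forward pass with overlapped rectangular pulse
# basis functions, following the abstract's formulas. Pulse layout, overlap,
# and the class/parameter names are illustrative assumptions.

class RectPulseSOPNN:
    def __init__(self, n_inputs, n_products, n_pulses=16,
                 x_min=0.0, x_max=1.0, overlap=2):
        self.n_inputs = n_inputs        # N_v: number of inputs
        self.n_products = n_products    # N_p: number of product terms
        self.n_pulses = n_pulses        # pulses per input dimension (assumed)
        self.x_min, self.x_max = x_min, x_max
        self.overlap = overlap          # pulses covering any one point (assumed)
        # One weight w_ijk per (product term i, input j, pulse k); a plain
        # array serves as the linear weight memory mentioned in the abstract.
        self.w = np.zeros((n_products, n_inputs, n_pulses))

    def _active_pulses(self, xj):
        """Indices k with B_jk(xj) = 1, i.e. the overlapped pulses covering xj."""
        step = (self.x_max - self.x_min) / self.n_pulses
        k0 = int(np.clip((xj - self.x_min) / step, 0, self.n_pulses - 1))
        return range(max(0, k0 - self.overlap + 1), k0 + 1)

    def forward(self, x):
        """y(x) = sum_i prod_j f_ij(x_j), with f_ij(x_j) = sum_k w_ijk B_jk(x_j)."""
        y = 0.0
        for i in range(self.n_products):
            prod = 1.0
            for j in range(self.n_inputs):
                # B_jk(x_j) is either 0 or 1, so f_ij(x_j) reduces to summing
                # the retrieved w_ijk values for the pulses that cover x_j.
                f_ij = sum(self.w[i, j, k] for k in self._active_pulses(x[j]))
                prod *= f_ij
            y += prod
        return y
```

Because each $B_{jk}(x_j)$ is either zero or one, evaluating $f_{ij}(x_j)$ only touches the few weights whose pulses cover $x_j$, which is what gives the structure its CMAC-like, table-lookup cost. The abstract does not specify how the pulses are laid out or how the retrieved weights are updated during training, so this sketch covers the forward pass only.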


Similar resources

Uniform Approximation Capabilities of Sum-of-Product and Sigma-Pi-Sigma Neural Networks

Investigated in this paper are the uniform approximation capabilities of sum-of-product (SOPNN) and sigma-pi-sigma (SPSNN) neural networks. It is proved that the set of functions that are generated by an SOPNN with its activation function in C(R) is dense in C(K) for any compact K ⊂ R^N, if and only if the activation function is not a polynomial. It is also shown that if the activation function ...
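Restated compactly (a paraphrase of the claim above; $\mathcal{S}_\sigma$ is shorthand introduced here for the set of functions an SOPNN with activation function $\sigma \in C(\mathbb{R})$ can generate):

$$ \mathcal{S}_\sigma \text{ is dense in } C(K) \text{ for every compact } K \subset \mathbb{R}^N \iff \sigma \text{ is not a polynomial.} $$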


A Design of EA-based Self-Organizing Polynomial Neural Networks using Evolutionary Algorithm for Nonlinear System Modeling

We discuss a new design methodology for a self-organizing approximator technique (self-organizing polynomial neural networks (SOPNN)) using an evolutionary algorithm (EA). The SOPNN dwells on the ideas of the group method of data handling. The performance of SOPNN depends strongly on the number of input variables available to the model, the number of input variables and the type (order) of the polynomials to ...


Self-Organizing Polynomial Neural Networks Based on Genetically Optimized Multi-Layer Perceptron Architecture

In this paper, we introduce a new topology of Self-Organizing Polynomial Neural Networks (SOPNN) based on genetically optimized Multi-Layer Perceptron (MLP) and discuss its comprehensive design methodology involving mechanisms of genetic optimization. Let us recall that the design of the “conventional” SOPNN uses the extended Group Method of Data Handling (GMDH) technique to exploit polynomials...


Self-Organizing Polynomial Neural Network for Modelling Complex Hydrological Processes

Artificial neural networks (ANNs) have been used increasingly for modelling complex hydrological processes. In this paper, we present a self-organizing polynomial neural network (SOPNN) algorithm, which combines the theory of bio-cybernetic self-organizing polynomial (SOP) with the artificial neural network (ANN) approach. With the SOP feature of seeking the best combination of polynomial mode...


Optimization of self-organizing polynomial neural networks

The main disadvantage of self-organizing polynomial neural networks (SOPNN) automatically structured and trained by the group method of data handling (GMDH) algorithm is a partial optimization of model weights, as the GMDH algorithm optimizes on...



Journal:
  • Neurocomputing

Volume 30, Issue

Pages  -

Publication date: 2000